Learning of Structurally Unambiguous Probabilistic Grammars

Authors

Abstract

The problem of identifying a probabilistic context-free grammar has two aspects: the first is determining the grammar's topology (the rules of the grammar) and the second is estimating probabilistic weights for each rule. Given the hardness results for learning context-free grammars in general, and probabilistic grammars in particular, most of the literature has concentrated on the second problem. In this work we address the first problem. We restrict attention to structurally unambiguous weighted context-free grammars (SUWCFG) and provide a query learning algorithm for structurally unambiguous probabilistic context-free grammars (SUPCFG). We show that SUWCFGs can be represented using co-linear multiplicity tree automata (CMTA), and provide a polynomial algorithm that learns CMTAs. The learned CMTA can be converted into a probabilistic grammar, thus providing a complete algorithm for learning a structurally unambiguous probabilistic context-free grammar (both the topology and the weights) using structured membership queries and structured equivalence queries. We demonstrate the usefulness of our algorithm in learning PCFGs over genomic data.
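The second aspect mentioned in the abstract, estimating a weight for each rule, can be illustrated with maximum-likelihood normalization of observed rule counts. This is a generic sketch, not the paper's algorithm; the rule names and counts below are purely illustrative.

```python
from collections import defaultdict

def estimate_rule_weights(rule_counts):
    """Turn rule usage counts into probabilities by normalizing
    over all rules sharing the same left-hand side."""
    totals = defaultdict(int)
    for (lhs, _rhs), count in rule_counts.items():
        totals[lhs] += count
    return {rule: count / totals[rule[0]]
            for rule, count in rule_counts.items()}

# Hypothetical counts gathered from observed derivations.
counts = {('S', ('NP', 'VP')): 8,
          ('S', ('VP',)): 2,
          ('NP', ('a',)): 5}
probs = estimate_rule_weights(counts)
```

Each nonterminal's outgoing rule probabilities then sum to 1, as a PCFG requires; here `S -> NP VP` gets weight 0.8 and `S -> VP` gets 0.2.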


Similar resources

Structurally Unambiguous Finite Automata

We define a structurally unambiguous finite automaton (SUFA) to be a nondeterministic finite automaton (NFA) with one starting state q0 such that for all input strings w and for any state q, there is at most one path from q0 to q that consumes w. The definition of SUFA differs from the usual definition of an unambiguous finite automaton (UFA) in that the new definition is defined in terms of th...
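The SUFA condition above (at most one path from q0 to any state q consuming any string w) can be checked by brute force on bounded-length strings by counting runs. A minimal sketch, assuming a transition-dictionary encoding of the NFA; the function names and the `max_len` bound are illustrative, not from the paper.

```python
from itertools import product

def path_counts(delta, q0, w):
    """Count the distinct runs from q0 to each state that consume w."""
    counts = {q0: 1}
    for ch in w:
        nxt = {}
        for q, c in counts.items():
            for r in delta.get((q, ch), ()):
                nxt[r] = nxt.get(r, 0) + c
        counts = nxt
    return counts

def is_sufa(alphabet, delta, q0, max_len=5):
    """Check structural unambiguity on all strings up to max_len:
    no (string, target state) pair may have more than one run."""
    for n in range(max_len + 1):
        for w in product(alphabet, repeat=n):
            if any(c > 1 for c in path_counts(delta, q0, w).values()):
                return False
    return True

# Nondeterministic on 'a' (q0 -> {q0, q1}), yet every string reaches
# each state along at most one path, so the SUFA condition holds.
sufa = {('q0', 'a'): ['q0', 'q1'], ('q0', 'b'): ['q0']}

# Two distinct paths from q0 to q3 consume "aa", so SUFA fails.
not_sufa = {('q0', 'a'): ['q1', 'q2'],
            ('q1', 'a'): ['q3'], ('q2', 'a'): ['q3']}
```

Note this only certifies the property up to `max_len`; an exact check would analyze the product of the automaton with itself.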


Unsupervised learning of probabilistic grammars



Unambiguous Boolean grammars

Boolean grammars are an extension of context-free grammars, in which all propositional connectives are allowed. In this paper, the notion of ambiguity in Boolean grammars is defined. It is shown that the known transformation of a Boolean grammar to the binary normal form preserves unambiguity, and that every unambiguous Boolean language can be parsed in time O(n²). Linear conjunctive languages a...


Learning restricted probabilistic link grammars

We describe a language model employing a new headed-disjuncts formulation of Lafferty et al.'s (1992) probabilistic link grammar, together with (1) an EM training method for estimating the probabilities, and (2) a procedure for learning some simple lexicalized grammar structures. The model in its simplest form is a generalization of n-gram models, but in its general form possesses context-free exp...


Covariance in Unsupervised Learning of Probabilistic Grammars

Probabilistic grammars offer great flexibility in modeling discrete sequential data like natural language text. Their symbolic component is amenable to inspection by humans, while their probabilistic component helps resolve ambiguity. They also permit the use of well-understood, general-purpose learning algorithms. There has been an increased interest in using probabilistic grammars in the Bayes...



Journal

Journal: Proceedings of the ... AAAI Conference on Artificial Intelligence

Year: 2021

ISSN: 2159-5399, 2374-3468

DOI: https://doi.org/10.1609/aaai.v35i10.17107